On Boosting and the Exponential Loss
Author
Abstract
Related Papers
SPLBoost: An Improved Robust Boosting Algorithm Based on Self-paced Learning
It is known that Boosting can be interpreted as a gradient descent technique that minimizes an underlying loss function. Specifically, the loss minimized by traditional AdaBoost is the exponential loss, which is known to be very sensitive to random noise and outliers. Several Boosting algorithms, e.g., LogitBoost and SavageBoost, have therefore been proposed to improve the robustness ...
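The noise sensitivity mentioned above comes from how fast the exponential loss grows on misclassified points. A minimal numeric sketch (the margin convention m = y·f(x) is the standard one, not taken from this abstract) comparing it with the logistic loss used by LogitBoost:

```python
import math

# Margin m = y * f(x): negative means the point is misclassified.
def exp_loss(m):
    """AdaBoost's exponential loss."""
    return math.exp(-m)

def logistic_loss(m):
    """LogitBoost's loss: log-loss expressed on the margin."""
    return math.log(1.0 + math.exp(-m))

# On a badly misclassified outlier (large negative margin) the
# exponential loss explodes, while the logistic loss grows only
# roughly linearly -- so AdaBoost concentrates its weight on noise.
for m in (-1.0, -5.0, -10.0):
    print(f"m={m:6.1f}  exp={exp_loss(m):12.2f}  logistic={logistic_loss(m):8.3f}")
```

This is why robustified variants replace or temper the exponential loss rather than change the gradient-descent view itself.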
Smooth ε-Insensitive Regression by Loss Symmetrization
We describe a framework for solving regression problems by reduction to classification. Our reduction is based on symmetrization of margin-based loss functions commonly used in boosting algorithms, namely, the logistic-loss and the exponential-loss. Our construction yields a smooth version of the ε-insensitive hinge loss that is used in support vector regression. Furthermore, this construction ...
Multi-class Boosting
This paper briefly surveys existing methods for boosting multi-class classification algorithms, and compares the performance of one such implementation, Stagewise Additive Modeling using a Multi-class Exponential loss function (SAMME), against that of Softmax Regression, Classification and Regression Trees, and Neural Networks.
Boosting and Maximum Likelihood for Exponential Models
We derive an equivalence between AdaBoost and the dual of a convex optimization problem, showing that the only difference between minimizing the exponential loss used by AdaBoost and maximum likelihood for exponential models is that the latter requires the model to be normalized to form a conditional probability distribution over labels. In addition to establishing a simple and easily understood ...
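The normalization difference described in this abstract can be made concrete. A toy sketch (the margins and the binary two-label setup are illustrative assumptions, not data from the paper): normalizing exp(y·f) over the two labels turns the exponential objective into the logistic negative log-likelihood.

```python
import math

# Toy margins m_i = y_i * f(x_i) for three training points.
margins = [2.0, 0.5, -1.0]

# AdaBoost's objective: unnormalized exponential loss.
exp_loss = sum(math.exp(-m) for m in margins)

# Maximum likelihood for the matching exponential model normalizes
# exp(y*f) over the two labels y in {-1, +1}:
#   P(y|x) = exp(y f) / (exp(f) + exp(-f)) = 1 / (1 + exp(-2 y f)),
# giving the logistic negative log-likelihood on doubled margins.
neg_log_lik = sum(math.log(1.0 + math.exp(-2.0 * m)) for m in margins)

print(f"exponential loss:        {exp_loss:.4f}")
print(f"negative log-likelihood: {neg_log_lik:.4f}")
```

Both objectives are convex in f and penalize negative margins; only the normalizing denominator separates them, which is the point of the equivalence.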
Generalized Boosted Models: A guide to the gbm package
Boosting takes on various forms with different programs using different loss functions, different base models, and different optimization schemes. The gbm package takes the approach described in [2] and [3]. Some of the terminology differs, mostly due to an effort to cast boosting terms into more standard statistical terminology (e.g. deviance). In addition, the gbm package implements boosting ...